A generalized conditional gradient method for nonlinear operator equations
Authors
Abstract
The intention of this paper is to show the applicability of a generalized conditional gradient method to the minimization of Tikhonov-type functionals, which arise in the regularization of nonlinear inverse problems with sparsity constraints. We consider functionals of Tikhonov type in which the usual quadratic penalty term is replaced by the pth power of a weighted p-norm. First, we analyze the convergence properties of the generalized conditional gradient algorithm. Further, we show that under certain conditions the considered minimization method coincides with a surrogate method. Numerical results are then illustrated by means of the nonlinear single photon emission computed tomography (SPECT) problem.
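The splitting behind such methods can be sketched in a hedged toy example: a linear operator K stands in for the paper's nonlinear operator, with p = 1 and unit weights (all names here are illustrative, not the paper's notation). The smooth part is F(x) = ||Kx - y||^2 - ||x||^2 and the convex part is Phi(x) = ||x||^2 + alpha*||x||_1, so the linearized conditional gradient subproblem is separable and solved by soft shrinkage; with step size 1 the iteration reduces to an iterative soft-thresholding (surrogate) method, as the abstract indicates.

```python
import numpy as np

def soft_shrink(v, lam):
    """Componentwise soft-thresholding S_lam(v) = sign(v) * max(|v| - lam, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def gcg_sparse_tikhonov(K, y, alpha, iters=200):
    """Generalized conditional gradient for T(x) = ||Kx - y||^2 + alpha*||x||_1,
    using the splitting F(x) = ||Kx - y||^2 - ||x||^2 (smooth) and
    Phi(x) = ||x||^2 + alpha*||x||_1 (convex), with fixed step size 1."""
    x = np.zeros(K.shape[1])
    for _ in range(iters):
        grad_F = 2.0 * K.T @ (K @ x - y) - 2.0 * x   # F'(x)
        # Subproblem min_z <grad_F, z> + ||z||^2 + alpha*||z||_1 is separable
        # and solved exactly by soft shrinkage of -grad_F/2:
        x = soft_shrink(-0.5 * grad_F, 0.5 * alpha)  # step size s = 1
    return x
```

With ||K|| < 1 each step decreases the functional monotonically; the choice s = 1 is what makes the iteration coincide with the shrinkage surrogate method.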
Similar resources
A Three-Term Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations
The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirements and ease of implementation. Research on its application to higher-dimensional systems of nonlinear equations is only beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...
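As a rough illustration of why conjugate gradient-type iterations suit large systems, here is a minimal sketch, not the cited three-term method: a standard Polak-Ribière nonlinear CG with Armijo backtracking applied to the least-squares merit function f(x) = 0.5*||F(x)||^2 of a small system F(x) = 0 (the system and all names are illustrative). Storage is a handful of vectors and no matrix factorization is needed.

```python
import numpy as np

def F(x):
    """Small illustrative system with a root at (1, 1)."""
    return np.array([x[0]**2 + x[1] - 2.0,
                     x[0] + x[1]**2 - 2.0])

def jacobian(x):
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, 2.0 * x[1]]])

def merit(x):
    """Least-squares merit function f(x) = 0.5 * ||F(x)||^2."""
    r = F(x)
    return 0.5 * r @ r

def cg_solve(x0, iters=200):
    x = np.asarray(x0, dtype=float)
    g = jacobian(x).T @ F(x)            # gradient of the merit function
    d = -g
    for _ in range(iters):
        if g @ g < 1e-16:               # stationary point reached
            break
        if g @ d >= 0.0:                # safeguard: restart with steepest descent
            d = -g
        t, fx = 1.0, merit(x)
        while merit(x + t * d) > fx + 1e-4 * t * (g @ d):   # Armijo backtracking
            t *= 0.5
            if t < 1e-12:
                break
        x_new = x + t * d
        g_new = jacobian(x_new).T @ F(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))      # Polak-Ribiere+
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```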
A new method for the generalized Hyers-Ulam-Rassias stability
We propose a new method, called the weighted space method, for the study of generalized Hyers-Ulam-Rassias stability. We apply this method to a nonlinear functional equation and to Volterra and Fredholm integral operators.
Stochastic differential inclusions of semimonotone type in Hilbert spaces
In this paper, we study the existence of generalized solutions for the infinite-dimensional nonlinear stochastic differential inclusions $dx(t) \in F(t,x(t))\,dt + G(t,x(t))\,dW_t$ in which the multifunction $F$ is semimonotone and hemicontinuous and the operator-valued multifunction $G$ satisfies a Lipschitz condition. We define the Itô stochastic integral of operator set-valued stochastic pr...
A New Strategy for Training RBF Network with Applications to Nonlinear Integral Equations
A new learning strategy is proposed for training radial basis function (RBF) networks. We apply two different local optimization methods to update the output weights during training: the gradient method, and a combination of the gradient and Newton methods. Numerical results obtained in solving nonlinear integral equations show the excellent performance of the combined gradient method in ...
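A minimal sketch of the gradient part of such a training step (the Gaussian basis, shapes, and all names are illustrative assumptions; the combined gradient-Newton update of the cited strategy is omitted):

```python
import numpy as np

def rbf_features(x, centers, s=0.1):
    """Design matrix Phi[i, j] = exp(-(x_i - c_j)^2 / (2 s^2))."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * s * s))

def train_output_weights(x, y, centers, lr=0.5, iters=2000):
    """Gradient descent on the mean squared error of the RBF output layer."""
    Phi = rbf_features(x, centers)
    w = np.zeros(len(centers))
    for _ in range(iters):
        r = Phi @ w - y                   # residual of the network output
        w -= lr * (Phi.T @ r) / len(x)    # gradient of 0.5 * mean(r^2)
    return w
```

Only the output weights are updated here, which keeps the problem linear in the trainable parameters; centers and widths are held fixed.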
An accelerated gradient based iterative algorithm for solving systems of coupled generalized Sylvester-transpose matrix equations
In this paper, an accelerated gradient based iterative algorithm for solving systems of coupled generalized Sylvester-transpose matrix equations is proposed. The convergence analysis of the algorithm is investigated. We show that the proposed algorithm converges to the exact solution for any initial value under certain assumptions. Finally, some numerical examples are given to demons...
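The basic gradient update underlying such iterations can be sketched for a single (uncoupled, untransposed) Sylvester equation AX + XB = C; the acceleration step and the coupled transpose structure of the cited paper are omitted, and all names are illustrative:

```python
import numpy as np

def gradient_sylvester(A, B, C, mu=0.05, iters=300):
    """Fixed-step gradient iteration for A X + X B = C, descending on the
    residual norm ||A X + X B - C||_F^2, whose gradient in X is
    2*(A^T R + R B^T); mu is the step taken along half that gradient."""
    X = np.zeros_like(C)
    for _ in range(iters):
        R = A @ X + X @ B - C              # current residual
        X -= mu * (A.T @ R + R @ B.T)      # descent step
    return X
```

For a well-conditioned Sylvester operator and a sufficiently small mu, the residual contracts by a fixed factor per step, which is the behavior accelerated variants then improve on.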